Miscellaneous docstring additions and fixes #1998
Conversation
Codecov Report

```
@@           Coverage Diff           @@
##           master    #1998   +/-   ##
=======================================
  Coverage   87.10%   87.10%
=======================================
  Files          20       20
  Lines        1528     1528
=======================================
  Hits         1331     1331
  Misses        197      197
```

Continue to review full report at Codecov.

cc: @DhairyaLGandhi
Force-pushed from de72a12 to a0a8536
src/layers/normalise.jl (Outdated)

```diff
@@ -10,7 +10,7 @@
 _dropout_shape(s, dims) = tuple((i ∉ dims ? 1 : si for (i, si) ∈ enumerate(s))...)
 _dropout_kernel(y::T, p, q) where {T} = y > p ? T(1 / q) : T(0)

 """
-    dropout([rng = rng_from_array(x)], x, p; dims=:, active=true)
+    dropout([rng = _rng_from_array(x)], x, p; dims=:, active=true)
```
Maybe this shouldn't mention this internal function? If you aren't looking at the source, you don't need to know it exists:

```diff
-    dropout([rng = _rng_from_array(x)], x, p; dims=:, active=true)
+    dropout([rng], x, p; dims=:, active=true)
```

Alternatively, if it's a function you may need to know about and overload for your `x`, then it should be left as `rng_from_array` and included in the manual.
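For readers following along, the optional leading `rng` argument under discussion is usually implemented in Julia with a fallback method that supplies a default RNG. A minimal, hypothetical sketch of that pattern (`my_dropout` is an illustrative name, not Flux's implementation; `dims` handling is omitted):

```julia
using Random

# Hypothetical sketch of the optional-rng pattern: a fallback method
# supplies a default RNG when the caller omits one.
my_dropout(x, p) = my_dropout(Random.default_rng(), x, p)

function my_dropout(rng::Random.AbstractRNG, x, p)
    q = 1 - p
    mask = rand(rng, size(x)...) .> p   # keep each entry with probability 1 - p
    return x .* mask ./ q               # rescale so the expectation is unchanged
end
```

Passing an explicit `rng` makes the result reproducible, while the shorter form falls back to the task-local default.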
Interesting. The docstring does redirect users to the `Dropout` construct, so should we be concerned with users using this function? If users are indeed using the `dropout` function, then it makes sense to treat `rng_from_array` as a public function and add it to the manual.
The majority of docstrings use the following notation for specifying the default value as `rng_from_array`:

```
[rng=GLOBAL_RNG]
```

Example:

```julia
kaiming_uniform([rng=GLOBAL_RNG], size...; gain = √2) -> Array
kaiming_uniform([rng]; kw...) -> Function
glorot_normal([rng=GLOBAL_RNG], size...; gain = 1) -> Array
glorot_normal([rng]; kw...) -> Function
```

etc.

I think these should also be changed. Now that I think of it, maybe we should keep it as `rng_from_array` and add it to the manual.
One point against mentioning `rng_from_array` in the docstring is that these functions don't actually use it! In most cases users encounter, it will behave as if it were, but there are enough edge cases that I'd prefer not to unintentionally mislead about expected behaviour.
Ah, so should I keep it private, and update the docstring as `[rng]`?
I stand corrected; it seems the functions in question were updated very recently to use it. I don't see much hurt in documenting `rng_from_array` and just noting it's an internal function, but you may want to wait for others to chime in. Another idea would be to call the zero-arg `rng_from_array` something else and document that. The current name is a bit of a misnomer anyway, because there is no array involved in that method...
Oh yes, something like `default_rng_value()`? Then maybe add both of them to the manual?
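The split being discussed can be sketched as a zero-argument default plus a method dispatching on the array type that downstream packages could overload. A hypothetical illustration (the helper name `rng_for` is an assumption for this sketch, not Flux's actual API):

```julia
using Random

# Hypothetical sketch: a zero-argument default RNG, plus an
# array-dispatched method other packages could overload.
default_rng_value() = Random.default_rng()

rng_for(x::AbstractArray) = default_rng_value()  # plain CPU arrays
# A GPU package might then define, e.g.:
# rng_for(x::SomeGPUArray) = SomeGPU.default_rng()
```

Keeping the zero-argument method separately named avoids the misnomer noted above, since no array is involved in that method.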
```diff
-Return an `Array{Float32}` of the given `size`, filled like `rand` or `randn`.
+Return an `Array{Float32}` of the given `size`, filled like `rand`.
 When the size is not provided, `rand32(rng::AbstractRNG)` returns a function.
 """
 rand32(dims::Integer...) = Base.rand(Float32, dims...)
```
Should this also call `_rng_from_array()`?
Should `_rng_from_array()` be added in the following way?

```julia
rand32(dims::Integer...) = Base.rand(_rng_from_array(), Float32, dims...)
```
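As a self-contained sketch of what that would look like, with the internal helper replaced by `Random.default_rng()` so it runs standalone (`my_rand32` is an illustrative name, not Flux's code):

```julia
using Random

# Sketch of rand32 threading an RNG through: an explicit-RNG method
# plus a fallback that picks a default RNG.
my_rand32(rng::Random.AbstractRNG, dims::Integer...) = rand(rng, Float32, dims...)
my_rand32(dims::Integer...) = my_rand32(Random.default_rng(), dims...)
```

With the explicit-RNG method, seeded calls become reproducible while the short form keeps its current behaviour.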
Force-pushed from 5c8b0a8 to 91013ff
I have renamed the function.
test/losses.jl (Outdated)

```diff
-@test Flux.tversky_loss(ŷ, y) ≈ -0.06772009029345383
-@test Flux.tversky_loss(ŷ, y, β=0.8) ≈ -0.09490740740740744
+@test Flux.tversky_loss(ŷ, y) ≈ 0.028747433264887046
+@test Flux.tversky_loss(ŷ, y, β=0.8) ≈ 0.050200803212851364
```
How did this change come about? It seems very drastic given the inputs have not changed.
I have tweaked the implementation here - https://github.com/FluxML/Flux.jl/pull/1998/files#diff-0ff18e54a53b74bc46260164231d6265a35569aa9a367cedda9f8baee211df42R534-R538.
See #1993. I'll revert the changes if the original implementation is correct. (Should have dealt with this in a separate PR, my bad.)
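For context, the Tversky loss is conventionally defined as one minus the Tversky index, which keeps it in [0, 1] and would be consistent with the expected test values changing sign. A minimal sketch of that conventional definition (not the PR's exact implementation, which may include smoothing terms):

```julia
# Conventional Tversky loss, 1 - Tversky index; β weights false
# negatives against false positives. Sketch only, not Flux's code.
function my_tversky_loss(ŷ, y; β = 0.7)
    tp = sum(y .* ŷ)          # true positives
    fp = sum((1 .- y) .* ŷ)   # false positives
    fn = sum(y .* (1 .- ŷ))   # false negatives
    return 1 - tp / (tp + β * fn + (1 - β) * fp)
end
```

In this form a perfect prediction gives a loss of 0 and a completely wrong one gives 1.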
Yes, it would be best to revert for this PR. I'll follow up in #1993.
Reverted!
Force-pushed from d03aed3 to f49ec34
Thanks @Saransh-cpp!
Added missing docstrings (in the manual), updated the existing ones, and added doctests in the following files:

- `stateless.jl`
- `losses/functions.jl`
- `utils.jl`

In addition, there are various miscellaneous docstring fixes, and I have also prepended an `_` to the internal API. This PR should close #1990. This PR also closes #1993, but I am not 100% sure if the current implementation is wrong. I will revert the changes back if everything is correct.
PR Checklist